Two Convergence Results for Continuous Descent Methods
Authors
Abstract
We consider continuous descent methods for the minimization of convex functionals defined on a general Banach space. We establish two convergence results for methods generated by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire category.
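For orientation, here is a minimal sketch of the continuous descent setting the abstract refers to; the notation ($f$, $V$, $x(t)$) is illustrative and not taken from the paper itself.

```latex
% Illustrative formulation of a continuous descent method for a convex
% functional f on a Banach space X (notation is ours, not the paper's).
\[
  x'(t) = V\bigl(x(t)\bigr), \qquad t \ge 0, \qquad x(0) = x_0 \in X,
\]
\[
  \text{where } V \text{ is a descent vector field for } f,
  \text{ so that } t \mapsto f\bigl(x(t)\bigr) \text{ is nonincreasing,}
\]
\[
  \text{and convergence means } \lim_{t \to \infty} f\bigl(x(t)\bigr) = \inf_{x \in X} f(x).
\]
```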
Similar resources
Convergence Results for a Class of Abstract Continuous Descent Methods
We study continuous descent methods for the minimization of Lipschitzian functions defined on a general Banach space. We establish convergence theorems for those methods which are generated by approximate solutions to evolution equations governed by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fiel...
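The role of approximate solutions mentioned above can be sketched as follows, again with illustrative notation: a locally absolutely continuous trajectory $x(\cdot)$ is an $\varepsilon$-approximate solution of the evolution equation governed by a vector field $V$ if

```latex
\[
  \bigl\lVert x'(t) - V\bigl(x(t)\bigr) \bigr\rVert \le \varepsilon
  \quad \text{for almost every } t \ge 0 .
\]
```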
A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations
In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...
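As background, the absolute value equation and one Levenberg-Marquardt step for its least-squares reformulation can be sketched as follows; this is illustrative and does not reproduce the conjugate-subgradient scheme of the cited paper.

```latex
% Absolute value equation and a basic Levenberg-Marquardt iteration
% (illustrative notation, not the cited paper's exact method).
\[
  \text{AVE: } A x - \lvert x \rvert = b, \qquad
  F(x) := A x - \lvert x \rvert - b, \qquad
  J(x) := A - \operatorname{diag}\bigl(\operatorname{sign}(x)\bigr),
\]
\[
  \bigl(J(x_k)^{T} J(x_k) + \mu_k I\bigr)\, d_k = -\,J(x_k)^{T} F(x_k),
  \qquad x_{k+1} = x_k + d_k, \qquad \mu_k > 0 .
\]
```

Here $J(x_k)$ is a generalized Jacobian of $F$ (the absolute value term is nonsmooth) and $\mu_k$ is the damping parameter.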
Two Settings of the Dai-Liao Parameter Based on Modified Secant Equations
Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007) with two approaches, which use an extended new conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...
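For context, the standard Dai-Liao conjugate gradient framework into which such parameter choices fit can be sketched as follows (standard notation; the modified secant equations of the cited paper are not reproduced here).

```latex
% Dai-Liao conjugacy condition and the resulting CG parameter (standard form).
\[
  d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
  d_{k+1}^{T} y_k = -t\, g_{k+1}^{T} s_k \quad (\text{DL conjugacy condition}),
\]
\[
  \beta_k^{\mathrm{DL}} =
  \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k} - t\, \frac{g_{k+1}^{T} s_k}{d_k^{T} y_k},
  \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k, \quad t > 0 .
\]
```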
On the convergence speed of artificial neural networks in the solving of linear systems
Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism, and generalization. This paper examines the effect of various learning methods on the speed of convergence of neural networks. To this end, we first introduce a perceptron method based on artificial neural networks, which has been applied to solve a non-singula...
A Generic Convergence Theorem for Continuous Descent Methods in Banach Spaces
We study continuous descent methods for minimizing convex functions defined on general Banach spaces and prove that most of them (in the sense of Baire category) converge.
Journal title:
Volume / issue:
Pages: -
Publication date: 2003